Pegasos: primal estimated sub-gradient solver for SVM
Authors
Abstract
Similar resources
Pegasos: Primal Estimated sub-GrAdient SOlver for SVM
We describe and analyze a simple and effective stochastic sub-gradient descent algorithm for solving the optimization problem cast by Support Vector Machines (SVM). We prove that the number of iterations required to obtain a solution of accuracy ε is Õ(1/ε), where each iteration operates on a single training example. In contrast, previous analyses of stochastic gradient descent methods for SVMs...
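As a rough illustration of the single-example update described in this abstract, here is a minimal Python sketch of a Pegasos-style step, assuming a linear model, labels in {-1, +1}, and step size 1/(λt); the function name, default parameters, and the optional norm projection are illustrative choices rather than a verbatim transcription of the paper.

```python
import numpy as np

def pegasos(X, y, lam=0.1, n_iters=10_000, seed=0):
    """Sketch of a Pegasos-style single-example sub-gradient step for the
    linear SVM objective (lam/2)*||w||^2 + mean hinge loss.
    Assumes X has shape (m, d) and y holds labels in {-1, +1}."""
    rng = np.random.default_rng(seed)
    m, d = X.shape
    w = np.zeros(d)
    for t in range(1, n_iters + 1):
        i = rng.integers(m)            # pick one training example at random
        eta = 1.0 / (lam * t)          # step size 1/(lambda * t)
        if y[i] * X[i].dot(w) < 1.0:   # hinge loss active: sub-gradient has a data term
            w = (1.0 - eta * lam) * w + eta * y[i] * X[i]
        else:                          # hinge loss inactive: only the regularizer contributes
            w = (1.0 - eta * lam) * w
        norm = np.linalg.norm(w)       # optional projection onto the ball of radius 1/sqrt(lam)
        if norm > 0.0:
            w = w * min(1.0, 1.0 / (np.sqrt(lam) * norm))
    return w
```

For example, `w = pegasos(X, y, lam=0.1)` returns a weight vector after 10,000 single-example updates, with `lam` playing the role of the SVM regularization parameter.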
GADGET SVM: a Gossip-bAseD sub-GradiEnT SVM solver
Distributed environments such as federated databases, wireless and sensor networks, and Peer-to-Peer (P2P) networks are becoming increasingly popular and well-suited for machine learning since they can store large quantities of data on a network. The distributed setting is complex in part because network topologies are often dynamic and the data available to algorithms changes frequently. Furthermore, ...
New Primal SVM Solver with Linear Computational Cost for Big Data Classifications
Support Vector Machines (SVMs) are among the most popular classification techniques in machine learning, so designing fast primal SVM algorithms for large-scale datasets has been a hot topic in recent years. This paper presents a new L2-norm regularized primal SVM solver using Augmented Lagrange Multipliers, with linear computational cost for Lp-norm loss functions. The most computationally intensive...
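For intuition only, a generic augmented-Lagrangian split of an L2-regularized primal SVM with an Lp-norm hinge-type loss might look as follows; the auxiliary variables e_i, multipliers μ_i, and penalty ρ are standard ALM notation introduced here for illustration, and the paper's actual formulation and update scheme may differ.

```latex
% Generic ALM sketch (not necessarily the paper's exact formulation):
% introduce auxiliary variables e_i = 1 - y_i (w^T x_i + b) so the Lp loss
% acts on e only, then relax the equalities with multipliers mu_i and penalty rho.
\[
  \mathcal{L}_{\rho}(w, b, e, \mu) \;=\;
      \tfrac{1}{2}\lVert w \rVert_2^2
      \;+\; C \sum_{i=1}^{m} \bigl[\, e_i \,\bigr]_+^{p}
      \;+\; \sum_{i=1}^{m} \mu_i \bigl( e_i - 1 + y_i (w^{\top} x_i + b) \bigr)
      \;+\; \tfrac{\rho}{2} \sum_{i=1}^{m} \bigl( e_i - 1 + y_i (w^{\top} x_i + b) \bigr)^{2} .
\]
```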
The Stochastic Gradient Descent for the Primal L1-SVM Optimization Revisited
We reconsider the stochastic (sub)gradient approach to the unconstrained primal L1-SVM optimization. We observe that if the learning rate is inversely proportional to the number of steps, i.e., the number of times any training pattern is presented to the algorithm, the update rule may be transformed into that of the classical perceptron with margin, in which the margin threshold increases linearly...
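To make the transformation mentioned above concrete, one can rescale a Pegasos-style step; the worked equation below is a sketch under assumptions of our own (objective (λ/2)||w||² plus hinge loss, step size 1/(λt), and the substitution u_t = λ(t−1)w_t), not necessarily the paper's exact derivation.

```latex
% Sub-gradient step on (lambda/2)||w||^2 + hinge loss for one example (x_i, y_i),
% with learning rate eta_t = 1/(lambda t).  Multiplying by lambda*t and setting
% u_t = lambda (t-1) w_t (t >= 2) turns it into a perceptron-with-margin update
% whose margin threshold lambda (t-1) grows linearly with the step count.
\[
  w_{t+1} \;=\; \Bigl(1 - \tfrac{1}{t}\Bigr) w_t
      \;+\; \tfrac{1}{\lambda t}\,
      \mathbb{1}\bigl[\, y_i \langle w_t, x_i \rangle < 1 \,\bigr]\, y_i x_i
  \quad\Longrightarrow\quad
  u_{t+1} \;=\; u_t \;+\;
      \mathbb{1}\bigl[\, y_i \langle u_t, x_i \rangle < \lambda (t-1) \,\bigr]\, y_i x_i .
\]
```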
A primal sub-gradient method for structured classification with the averaged sum loss
We present a primal sub-gradient method for structured SVM optimization defined with the averaged sum of hinge losses inside each example. Compared with the mini-batch version of the Pegasos algorithm for the structured case, which deals with a single structure from each of multiple examples, our algorithm considers multiple structures from a single example in one update. This approach should...
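To illustrate what an update over multiple structures from a single example could look like, here is a small Python sketch of the sub-gradient of an averaged sum of hinge losses; the function name, the candidate set, and the loss terms Delta are hypothetical stand-ins, not the paper's exact definitions.

```python
import numpy as np

def averaged_hinge_subgradient(w, phi_gold, phi_candidates, deltas):
    """Sketch: sub-gradient (w.r.t. w) of the averaged sum of hinge losses
    over K candidate structures of a single example,
        (1/K) * sum_k max(0, deltas[k] + w.(phi_candidates[k] - phi_gold)).
    phi_gold:       feature vector of the gold structure, shape (d,)
    phi_candidates: feature vectors of the K candidate structures, shape (K, d)
    deltas:         structured losses Delta(y, y_hat_k), shape (K,)"""
    margins = deltas + phi_candidates.dot(w) - phi_gold.dot(w)  # hinge arguments
    active = margins > 0                                        # candidates violating the margin
    if not np.any(active):
        return np.zeros_like(w)
    # each active hinge term contributes (phi_candidate - phi_gold); average over all K terms
    return (phi_candidates[active] - phi_gold).sum(axis=0) / len(deltas)
```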
Journal
Journal title: Mathematical Programming
Year: 2010
ISSN: 0025-5610, 1436-4646
DOI: 10.1007/s10107-010-0420-4